ONNX Support #3562
Conversation
Update the text-to-image and image-to-image graphs to work with the new model loader. Currently only supports 1.x models; will update this soon to make it work with all models.
Co-Authored-By: StAlKeR7779 <[email protected]>
Basically updated all slices to have more descriptive names, to make sure there's a good naming scheme available for secondary models.
This reverts commit e0c105f.
So the long names do not get cut off.
…ms to model schema (as they are not referenced in the case of a Literal definition)
Correct. It was an Olive-optimized model.
Comments:
Overall a very positive experience. Just need a better step-by-step guide for people like me who are coming to the ONNX architecture from basically zero.
Needs img2img and canvas support and a better step-by-step guide. Overall, though, looks great.
Thanks for checking it out! I updated the installer to install onnxruntime / onnxruntime-gpu / onnxruntime-directml through optional dependencies, based on the user's selections for different combinations of GPUs and driver installations. Do you think updating the README to tell people they need to install the correct onnxruntime for their environment would suffice?
Also, yes, I currently have ONNX enabled only for linear text-to-image and the node editor; it's disabled on all other screens. I was thinking it would be good to get it out with minimal functionality and slowly roll out more features in separate PRs, rather than maintaining this as it gets bigger. Definitely open to a discussion around this, though. I'll also work on better documentation for how to use ONNX in workflows, which is definitely a weak spot in this PR!
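The extras layout described in that comment might look something like the following in pyproject.toml. This is an illustrative sketch only; the actual extra names and package pins used by the PR are assumptions here, not taken from the diff:

```toml
# Hypothetical sketch of hardware-specific optional dependencies;
# the real extra names and versions in the PR may differ.
[project.optional-dependencies]
onnx = ["onnxruntime"]                    # CPU execution provider
onnx-cuda = ["onnxruntime-gpu"]           # NVIDIA GPUs (CUDA)
onnx-directml = ["onnxruntime-directml"]  # DirectML on Windows
```

A user would then pick exactly one extra matching their hardware (e.g. `pip install "invokeai[onnx-cuda]"`), which is what "based on the user's selections" refers to.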
Merged after discussion with LStein & Brandon on Discord.
There are a number of issues that need to be fixed.
Also, the PR was merged with 9 errors on the frontend check; there are a number of typing problems. I've added comments for each problem.
```diff
@@ -61,6 +61,7 @@ dependencies = [
     "numpy",
     "npyscreen",
     "omegaconf",
+    "onnx",
```
I had to install this after updating to main in order for the app to run. Then I also had to install onnxruntime manually, I guess because onnx was installed; otherwise the app did not start up. I did not ask for onnx, so I'm not sure the optional dependencies are working as expected.
```typescript
export const zMainModel = z.object({
  model_name: z.string().min(1),
  base_model: zBaseModel,
  model_type: zModelType,
});

/**
 * Type alias for model parameter, inferred from its zod schema
 */
export type MainModelParam = z.infer<typeof zMainModel>;

/**
 * Type alias for model parameter, inferred from its zod schema
 */
export type OnnxModelParam = z.infer<typeof zMainModel>;
```
Schemas with a single valid value for a property should just have the literal there. If the main model schema needs `model_type` now, it should have `model_type: z.literal("main")`. Same for the onnx model schema. Otherwise the client will accept an onnx or SD main model in either situation.
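The discrimination the reviewer is asking for can be sketched as follows. With zod it would be `model_type: z.literal("main")`; the sketch below uses plain TypeScript (hypothetical names, not the PR's actual code) so it stays dependency-free, but the enforced behavior is the same: a payload with the wrong `model_type` is rejected instead of silently accepted.

```typescript
// Hypothetical sketch: discriminate model payloads on a literal model_type,
// mirroring what z.literal("main") / z.literal("onnx") would enforce in zod.
type MainModel = { model_type: "main"; model_name: string };

function parseMainModel(input: { model_type: string; model_name: string }): MainModel {
  // Reject anything that is not explicitly a "main" model, so an onnx
  // payload can no longer slip through the main-model schema.
  if (input.model_type !== "main") {
    throw new Error(`expected model_type "main", got "${input.model_type}"`);
  }
  if (input.model_name.length < 1) {
    throw new Error("model_name must be non-empty");
  }
  return { model_type: "main", model_name: input.model_name };
}
```

With the shared `zModelType` used in the original snippet, both schemas accept both literals, which is exactly the bug being pointed out.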
This error shouldn't appear when I am trying to run the CPU version. Even when I install the CPU-specific toolsets, the error is still seen.
Note: this branch is based on #3548, not on main.
While figuring out what needed to be done to implement ONNX, I found that I could put together a draft of it pretty quickly, so here it is.
Supports LoRA and TI.
As an example, a cat with the sadcatmeme LoRA: